Random naive Bayes extends the Naive Bayes classifier by adopting the random forest principles: random input selection, bagging (i.e. bootstrap aggregating), and random feature selection ([1]).
The naive Bayes classifier is a probabilistic classifier that simplifies the application of Bayes' theorem by naively assuming class conditional independence of the features. Although this assumption leads to biased posterior probabilities, the ordered probabilities of Naive Bayes result in a classification performance comparable to that of classification trees and neural networks.[2] Notwithstanding Naive Bayes' popularity due to its simplicity combined with high accuracy and speed, its conditional independence assumption rarely holds. There are mainly two approaches to alleviate this naivety: selecting subsets of attributes within which the independence assumption approximately holds, or extending the structure of Naive Bayes so that it can represent dependencies between attributes (as AODE does).
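Stated as a formula (a standard formulation of the assumption, not taken verbatim from [2]), class conditional independence means that the posterior used for classification factorizes over the individual features:

P(C \mid x_1, \ldots, x_n) \;\propto\; P(C) \prod_{i=1}^{n} P(x_i \mid C)

so each attribute x_i contributes an independent likelihood term and dependencies between attributes are ignored.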
Random Naive Bayes adopts the first approach by randomly selecting subsets of attributes within which the attributes are assumed to be conditionally independent. Naive Bayes' performance may benefit from this random feature selection. Like AODE, Random Naive Bayes builds an ensemble, but unlike AODE, which averages one-dependence estimators, the ensemble combines zero-dependence (plain Naive Bayes) classifiers.
Generalizing Random Forest to Naive Bayes, Random Naive Bayes (Random NB) is a bagged classifier combining a forest of B Naive Bayes classifiers. The bth Naive Bayes classifier is estimated on a bootstrap sample S_b using m randomly selected features. To classify an observation, the input vector is put down each of the B Naive Bayes classifiers in the forest, and each classifier generates posterior class probabilities. Unlike Random Forest, the predicted class of the ensemble is obtained by adjusted majority voting rather than plain majority voting, since each Naive Bayes classifier delivers continuous posterior probabilities instead of a single class vote. Similar to Random Forest, the importance of each feature is estimated on the out-of-bag (oob) data.
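The following is a minimal sketch of this procedure, not an implementation from [1]: it assumes continuous features and uses scikit-learn's GaussianNB as the base learner, interprets "adjusted majority voting" as averaging the B posterior class-probability vectors, and defaults m to roughly the square root of the number of features; the names fit_random_nb and predict_random_nb are hypothetical.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def fit_random_nb(X, y, B=100, m=None, seed=0):
    """Fit B Gaussian naive Bayes models, each on a bootstrap sample S_b
    restricted to m randomly chosen features (Random NB sketch)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    if m is None:                                    # assumed default: ~sqrt(p) features
        m = max(1, int(np.sqrt(p)))
    classes = np.unique(y)
    forest = []
    for _ in range(B):
        rows = rng.choice(n, size=n, replace=True)   # bootstrap sample S_b
        feats = rng.choice(p, size=m, replace=False) # m randomly selected features
        nb = GaussianNB().fit(X[np.ix_(rows, feats)], y[rows])
        forest.append((nb, feats))
    return forest, classes

def predict_random_nb(forest, classes, X):
    """Average the B posterior class-probability vectors ("adjusted majority
    voting", as interpreted here) and predict the class with the highest mean."""
    probs = np.zeros((X.shape[0], len(classes)))
    for nb, feats in forest:
        # Align each model's (possibly smaller) class set with the global class order.
        idx = np.searchsorted(classes, nb.classes_)
        probs[:, idx] += nb.predict_proba(X[:, feats])
    return classes[np.argmax(probs / len(forest), axis=1)]
```

Out-of-bag feature importance is omitted from the sketch; as in Random Forest, it could be estimated by permuting a feature within each bootstrap sample's out-of-bag rows and measuring the resulting drop in accuracy.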